Single-tree GMM training

Author

  • Ryan R. Curtin
Abstract

In this short document, we derive a tree-independent single-tree algorithm for Gaussian mixture model training, based on a technique proposed by Moore [8]. The solution we provide is tree-independent and thus works with any type of tree and any type of traversal; this is more general than Moore's original formulation, which was limited to mrkd-trees. This allows us to develop a flexible, generic implementation for GMM training of the type commonly found in the mlpack machine learning library [3].

A better introduction to Gaussian mixture models, their uses, and their training is given by both [9] and [2]; readers unfamiliar with GMMs should consult those references, as this brief discussion is intended as a refresher and to establish terminology.

Before describing the single-tree algorithm, assume that we are given a dataset S = {p0, p1, . . . , pn}, and we wish to fit a Gaussian mixture model with m components to this data. Each component in our Gaussian mixture model θ is described as cj = (φj, μj, Σj) for j ∈ [0, m), where φj = P(cj | θ) is the mixture weight of component j, μj is the mean of component j, and Σj is the covariance of component j. Then, the probability of a point p arising from the GMM θ may be calculated as

    P(p | θ) = Σ_{j=0}^{m−1} φj N(p; μj, Σj),

where N(p; μj, Σj) denotes the multivariate Gaussian density with mean μj and covariance Σj.
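As an illustrative sketch (not taken from the paper), the mixture density above can be evaluated directly from the component definitions. For simplicity this example is one-dimensional, using scalar variances in place of the full covariances Σj; the function names and the example mixture are hypothetical.

```python
import math

def gaussian_pdf(x, mu, sigma2):
    # Univariate normal density N(x; mu, sigma2) with variance sigma2.
    return math.exp(-(x - mu) ** 2 / (2.0 * sigma2)) / math.sqrt(2.0 * math.pi * sigma2)

def gmm_pdf(x, components):
    # P(x | theta) = sum_j phi_j * N(x; mu_j, sigma2_j),
    # where each component is a (weight, mean, variance) tuple.
    return sum(phi * gaussian_pdf(x, mu, s2) for (phi, mu, s2) in components)

# A hypothetical two-component mixture theta; the weights phi_j sum to 1.
theta = [(0.6, 0.0, 1.0), (0.4, 5.0, 2.0)]
print(gmm_pdf(0.0, theta))  # dominated by the first component near x = 0
```

During EM-based training, this same density appears in the E-step, where each point's responsibility toward component j is φj N(p; μj, Σj) divided by the full sum; the single-tree algorithm accelerates exactly these repeated density evaluations.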


Similar resources

Gmm-free Dnn Training

While deep neural networks (DNNs) have become the dominant acoustic model (AM) for speech recognition systems, they are still dependent on Gaussian mixture models (GMMs) for alignments both for supervised training and for context dependent (CD) tree building. Here we explore bootstrapping DNN AM training without GMM AMs and show that CD trees can be built with DNN alignments which are better ma...


Towards a more efficient SVM supervector speaker verification system using Gaussian reduction and a tree-structured hash

Speaker verification (SV) systems that employ maximum a posteriori (MAP) adaptation of a Gaussian mixture model (GMM) universal background model (UBM) incur a significant test-stage computational load in the calculation of a posteriori probabilities and sufficient statistics. We propose a multi-layered hash system employing a tree-structured GMM which uses Runnalls' GMM reduction technique. The ...


Regularized nonnegative matrix factorization using Gaussian mixture priors for supervised single channel source separation

We introduce a new regularized nonnegative matrix factorization (NMF) method for supervised single-channel source separation (SCSS). We propose a new multi-objective cost function which includes the conventional divergence term for the NMF together with a prior likelihood term. The first term measures the divergence between the observed data and the product of the basis and gains matrices. T...


Decision tree studies in estimating breast cancer risk using single nucleotide polymorphisms

Abstract Introduction: Decision trees are data mining tools used to collect, accurately predict, and sift information from massive amounts of data, and are widely used in computational biology and bioinformatics. In bioinformatics, they can be used to predict diseases, including breast cancer. The use of genomic data including single nucleotide polymorphisms is a very important ...


Memory-Based Approximation of the Gaussian Mixture Model Framework for Bandwidth Extension of Narrowband Speech

In this paper, we extend our previous work on exploiting speech temporal properties to improve Bandwidth Extension (BWE) of narrowband speech using Gaussian Mixture Models (GMMs). By quantifying temporal properties through information theoretic measures and using delta features, we have shown that narrowband memory significantly increases certainty about highband parameters. However, as delta f...



Publication year: 2015